[code-infra] Move Link to @mui/docs #40889
Conversation
Netlify deploy preview: https://deploy-preview-40889--material-ui.netlify.app/
Bundle size report
This reverts commit e5833b0.
Signed-off-by: Jan Potoms <2109932+Janpot@users.noreply.github.com>
@mui/code-infra I love the idea of isolating what's docs-infra and what's docs into separate folders in https://github.com/mui/material-ui. For example, that's the objective with https://github.com/mui/material-ui/tree/master/docs/public/static/docs-infra. I think it rings with 2 ⬇️: having correct abstractions to know what impacts what.
However, when I look at
https://www.notion.so/mui-org/mui-docs-02a0476bcb2d4e298e8540fd2df5a137,
and @mui/docs, there are two things I'm missing:
Why work on @mui/docs now? As far as I understand the problem, we want Base UI to have its own repository:
1. The pragmatic thing, to me, sounds like reproducing what MUI X and Toolpad did. We need to do it for MUI Connect/Design Kits as well. I don't expect this to be a lot of work, a week? Doing this first sounds valuable: don't create abstractions until we experience the duplication pain.
2. Then, the immediate problem we will have is code duplication. So we should create correct API abstractions to remove all the duplication, for example to fix the static files cross-dependency issue.
3. Then, the highest priority would be to evolve the docs-infra, we have a
good chunk of performance issues, a Next.js App Router migration to do, a
theming differentiation to create, and a likely migration to a subdomain.
4. Then, the highest priority might be to isolate the npm dependencies, though we have been living with this for 3 years now and I don't recall it leading to a lot of problems.
### Why this direction with @mui/docs?
1. Is an npm package the right solution? As far as I understand the problem, this https://github.com/mui/mui-x/blob/e50aae8ae87e63cf67a67b2da5ffad16d8cf9651/package.json#L89 is not great because we have no isolation of the npm dependencies with yarn.
How far did the discussion on adding https://github.com/mui/material-ui as a git submodule go? I mean, reproducing Hashicorp's setup: https://github.com/hashicorp/vault/blob/main/website/scripts/website-start.sh#L30. Why would this not be simpler? We wouldn't need to publish, we wouldn't need to build, only import and have the transpilers do their magic.
2. Why not enforce the code to move in sync with a single npm package? Assuming we can't use git submodules, why have multiple npm packages, e.g. @mui/docs? Having a single @mui/internal npm package and putting everything inside forces us to keep everything in sync; it's a forcing function to prevent making large technical debt "debits". Large breaking changes would be spotted sooner. Are we at a scale where we need to migrate code chunk by chunk? I doubt it; if we were 100 engineers, it would make more sense to me. I think that the file should be the smallest level at which we support code evolution, and the folder the smallest level at which we support code segmentation, not npm packages. Need to migrate from React 17 to 18 on a massive codebase? You install both and migrate one file at a time.
In practice, it could look like this:
```jsx
import { Bar } from '@mui/internal/docs-infra/foo/baz';
```
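The "install both React versions" idea above can be expressed with an npm alias; a minimal sketch (the alias name and versions are illustrative, not from this thread):

```json
{
  "dependencies": {
    "react": "^18.2.0",
    "react-17": "npm:react@^17.0.2"
  }
}
```

Migrated files import from `react`, while not-yet-migrated files keep importing the `react-17` alias, so a massive codebase can move one file at a time.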
Not going to comment here on the why now (cc @mnajdova @michaldudak), but I can already comment on some of the points made.
We've made this exact argument when we built the Toolpad docs, and we felt the duplication pain. The pains we encountered are dependency management and collaborating while making breaking changes to the setup. The duplication is described in the notion doc.
We can do several more repositories like this, but there is always a risk that we further bifurcate the setup. We may end up with every team doing their own docs/code infra again. We already have certain differences between Toolpad and X that add challenge to extracting the @mui/docs package. This will only increase the more teams adopt this pattern.
This is a problem of implicit dependencies as well. One way it can be solved is by making them explicit. Use an image loader in the build pipeline and add them in the
Migrations could become less challenging when there are strong contracts between the different repositories. The published packages are a stronger contract than randomly importing nested folders of a nested workspace of a github dependency.
You'll have to clarify what you mean with isolation. The main problems I see with dependencies in our current setup:
I've spent considerable time dealing with these issues over the past 2 years. The issues always come unexpectedly, and when the monorepo becomes unmergeable it affects everyone operating on this interface between core and the other repositories.
Because the magic becomes a whole lot less magic if all you're doing is generating static HTML from static markdown: it's a pure function that doesn't depend on the environment. Our use case involves including highly dynamic content in our markdown (demos/playgrounds). This content is more than just .md files; it includes JavaScript files and their dependencies, JavaScript that has to live in different places: partly in the docs folder and partly in the docs infra. package.json dependencies are a great way to create a strong contract between a JavaScript file and its dependencies. But that only works if you install them correctly.
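For illustration only, that contract could look something like this hypothetical manifest for a demos folder (the package name and version ranges are assumptions, not anything that exists in the repo):

```json
{
  "name": "@mui/internal-docs-demos",
  "private": true,
  "dependencies": {
    "@mui/material": "^5.15.0",
    "react": "^18.2.0"
  }
}
```

Declaring the demos' dependencies explicitly means a consumer repository installs exactly what the demos need, instead of relying on whatever happens to be hoisted in a parent workspace.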
Yep, I'm in favor of larger packages as well. Though from an organizational perspective I think one single one likely groups too many unrelated concerns together; I've always advocated for a
On the weekly when this was discussed we agreed to do it this way; we even mentioned that each change will be tested both in X and Toolpad to make sure the abstractions are done right. This is what I remember, but probably @michaldudak can answer with more details.
I am a supporter of multiple smaller packages here. Having several fine-grained packages scales better and allows us to move faster as an organisation. The current setup with the monorepo dependency can be thought of as the equivalent of one big package dependency. Numerous times I could not test whether a change in one of the shared utils works in the X repo, because some completely unrelated code hadn't been integrated yet and the latest master did not build cleanly. Having code in separate packages allows me to iterate on it without waiting on unrelated modules.
Because as we're introducing yet another repository, I would like to avoid generating more technical debt right from the start. As @Janpot pointed out, the current setup requires an overhaul, but I can understand there's never a good time to make such significant changes. Introducing a new repository was a good motivation to finally take care of this.
packages/mui-docs/package.json
Considering the latest discussions, this should be renamed to @mui/internal-docs
I was planning to make renaming the package a separate effort, as the package already existed and this PR already requires updating things on downstream repositories.
Fair enough 👍
packages/mui-docs/src/Link/Link.tsx
Do you think it's worth having both the component file and a separate index file when there are no other files in a directory? In i18n, everything is directly inside index.
I don't really see the point, except when it's a convention 🙂 . I've added the index file 👍
@Janpot OK, I can see this. We could say: we invert steps 1 and 2 because we have enough "materials" to work with; MUI X and Toolpad are enough. With this in mind:
I'm not sure it's so much about 3b. If we solve the CSS file duplication by migrating to the Next.js App Router, it looks like 3a. If we solve it with a script that copies the source, it looks like 1.
Agree. What I see is that we might be able to get 80% of the value by working on 1: creating a dedicated docs-infra folder to host that logic. We might not need to go as far as introducing an npm package yet.
Ok, so when I said "we have been living with it for 3 years now", I actually only experienced 1 of these 3 years. The difference might have been that I knew about all the changes in core because I merged or reviewed them, so when I saw them break X, it felt simple to fix.
I don't know; to me it feels like we would trade one thing for another. The downside seems to be that contributors to MUI X would have more to do than these two commands. The upside seems to be no npm publish; but to be fair, a continuous CI step that releases each npm version sounds enough, though it's something we have to maintain. Ok, not as much of a clear win then? The pain of community contributions frightens me. I contributed to repositories in the past where setup was more than these two commands, pretty bad UX.
I think we should handle changes in the infra so that no syncing is required, but small continuous updates work. I see one value in having
@michaldudak This is the value I see on my end with a single npm package: back pressure. As I understand the example of the pain: someone made too many debt "debits", they changed an API without taking into account the bandwidth of the dependents to handle the change. If it makes us slow to handle this debt, it wouldn't go unnoticed: either we learn it's too much and revert, or we push through it and learn for the next time. Ok, so from all of this, as long as these are reflected on, 👍:
```tsx
  children?: React.ReactNode;
}

export function DocsProvider({ config, defaultUserLanguage, children }: DocsProviderProps) {
```
Can we use either default or named exports consistently? Here, the DocsProvider is exported by name, while Link has a default export. I prefer the named ones: there's no need to re-export both `*` and `default` from the index file, and it looks better when importing multiple identifiers from a module (IntelliSense gives all options in one place).
You're right, didn't even think about it, just copied from the docs blindly 😄
All right, it looks good to me now!
@jan Yes; however, making it a package.json brings extra overhead. As I see it: we need to define each dependency, we need to resolve circular dependencies, we need to batch Renovate dependency updates. What we get in exchange seems slim. When I see feedback like https://news.ycombinator.com/item?id=15889691, I read that the right play is to go all-in on being as close as possible to one codebase, each change being atomic and org-wide. I see npm packages more as a way to share sources than binaries.
I'm saying that we could use git submodules to solve the same problem that npm packages solve today. But it comes with extra complexity for contributors: it's no longer just two commands to get a working environment. With git submodules, the community and maintainers would need a third build step.
Disagree, that doesn't sound good enough; I think it should be for each commit, or at worst daily. We need to be able to bisect when something goes wrong to identify the commit to revert.
This sounds great. The only downside I can think of is that when we use a GitHub repository as a dependency, we download the git history at the same time, so it's not so fast to install. But otherwise 👌
@michaldudak The goal of having one person full-time for code-infra & docs-infra in 2024 is to have time to go after larger initiatives and to own the whole org. So I would expect migrations to be mostly handled by the two people in the role. I guess that with more scale, there will be no other way than to request team help. In any case, the fundamental constraint I want to enforce is that most changes docs-infra/code-infra make should be backward compatible. We WON'T (in the majority of cases) get this backward compatibility from pinning older npm versions, but by keeping old APIs working until they are no longer depended on. Each time we don't update, we fail the primary objective of having a single codebase: we are out of sync with HEAD.
You need to shape this submodules idea in a more detailed way. I really don't see the architecture you have in mind. Do you suggest we do:

```
# ./.gitmodules
[submodule "monorepo"]
  path = monorepo
  url = https://github.com/mui/material-ui
```

to create the following file structure with a nested workspace root?

```
mui-x\
  docs\
    pages\
      getting-started.tsx
      _document.tsx
    package.json
    next.config.mjs
  node_modules\
  monorepo\
    docs\
      pages\
        getting-started.tsx
        _document.tsx
      package.json
      next.config.mjs
    packages\
      material\
        package.json
    package.json
  packages\
    x-data-grid\
      package.json
    x-charts\
      package.json
  package.json
```

This setup is essentially the same as our current setup. All this does is replace
I think you are overlooking the following sentence: "- Blaze. Much like Bazel. This is the system that defines how to build various artifacts. All dependencies are explicit, meaning you can create a true dependency graph for any piece of code. This is super-important because of..." This touches the crux of the problem. There is no dependency graph in the submodules setup, nor is there one in our current setup. The root of our problems is dependencies, not colocation of code files. The maintainability that comes from monorepos is atomic changes, to the individual files, but also to the dependency graph as a whole.
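As a toy sketch of that point (nothing here is MUI code; the package names are made up): once every package declares its dependencies explicitly, you have a true graph and can derive a build order from it, which is exactly what implicit dependencies make impossible.

```typescript
// Map each package to the packages it depends on.
type Graph = Record<string, string[]>;

// Depth-first topological sort: dependencies come before dependents.
function buildOrder(graph: Graph): string[] {
  const order: string[] = [];
  const state: Record<string, 'visiting' | 'done'> = {};

  function visit(pkg: string): void {
    if (state[pkg] === 'done') return;
    if (state[pkg] === 'visiting') throw new Error(`dependency cycle at ${pkg}`);
    state[pkg] = 'visiting';
    for (const dep of graph[pkg] ?? []) visit(dep);
    state[pkg] = 'done';
    order.push(pkg);
  }

  for (const pkg of Object.keys(graph)) visit(pkg);
  return order;
}

// Hypothetical workspace: the docs app depends on a shared docs package,
// which depends on the component library.
const workspace: Graph = {
  docs: ['@mui/docs'],
  '@mui/docs': ['@mui/material'],
  '@mui/material': [],
};

console.log(buildOrder(workspace)); // library first, docs app last
```

With such a graph you can also answer "what does a change to X affect?" by walking the edges in reverse, which is the atomicity argument above in miniature.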
@Janpot My assumption is that, from there, we can configure part of /monorepo/ in the MUI X pnpm workspace and have pnpm install the monorepo dependencies.
My assumption is that with the submodule path, we would continue to create packages, but not consume them through an npm registry.
Yes, great answer, that's what I assumed as well; I've been playing with this in Toolpad before. So you'd update the workspace definition to:

```yaml
packages:
  - 'packages/*'
  - 'docs'
  - 'test'
  - 'monorepo/packages/*'
```

right? So that all the core workspaces become X/Toolpad workspaces as well. Problems that I can think of that need to be anticipated or resolved:
I'm not rejecting this but for me:
Ok, so the concern really centers around the "publishing to npm registry". Does that mean we can move forward with this PR and have a
@Janpot I don't get this one. From the perspective of pnpm, it looks like a normal folder. But ok, maybe there will be bugs when doing a git pull.
Unless we have two separate top level folders. One for mui/material-ui packages, one for the infra.
This could help, e.g. I have VS Code configured to have all MUI code when searching/jumping to files.
We don't have to use the explicit workspace:* syntax. We could set a regular version dependency. If it's in the workspace, pnpm resolves it locally; if not, it goes through npm.
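A sketch of that approach in a consumer repository's package.json (the version is illustrative):

```json
{
  "dependencies": {
    "@mui/docs": "^1.0.0"
  }
}
```

When @mui/docs is present in the local pnpm workspace it gets linked; otherwise the range resolves against the npm registry, so the same manifest works with and without the submodule checked out.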
I'm not aware of how pnpm inject works. I don't see how depending on the weekly release is an option. For example, if I really have to make a breaking change to docs-infra, then as soon as my change is merged, I should be able to go to MUI X and make the corresponding change in their repository (or get buy-in from someone on that team who supports the change and is happy to make it happen).
Yes, concerns are:
Yes, I'm happy with the conclusion of #40889 (comment) (I don't know if the git submodule is better, but it's worth weighing the pros & cons).
It looks like a folder that looks like a workspace root but isn't used as one, with a pnpm lock file and a pnpm-workspace.yaml. The folder contains nested packages that were all written with the expectation of being installed under a different workspace root, and which will try to install under a workspace root higher up, using a lock file higher than the first one encountered by traversing up. I can imagine a package manager getting confused.
Ok, I'm open to trying out this setup; we can always move to publishing packages when we find that this doesn't scale well. @michaldudak thoughts?
I don't see creating packages as friction. After the initial setup, authoring a package shouldn't be different than working in a subdirectory. The points raised as drawbacks ("we need to define each dependency. We need to resolve circular dependencies, we need to batch renovate dependencies updates") I see as advantages: we have a clearer dependency tree.

I agree with @oliviertassinari that they should be released as soon as possible, preferably after each merged PR. There is a way to automate this process completely (along with changelog creation, with something like conventional commits that Lerna supports). Unfortunately, with our current setup, this is somewhat problematic (I haven't found a way to release just a subset of workspace packages with

One big advantage of using packages that I believe wasn't mentioned here is that they share code that's ready to use. We don't need to set up aliasing, build orchestrators, etc. We simply use the built output of the package, and it works no matter if we call it from TS, JS, Node, or a web app. Additionally, this can save us CI credits we don't need to spend on building the package each time an unrelated piece of code changed.

I'd like to point out that at the meetings during the last retreat, we collectively agreed to share code using npm packages, and if my memory serves me well, this decision was welcomed with enthusiasm by people who work with infra on a daily basis.
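One way the "release after each merged PR" idea could be wired up, sketched as a GitHub Actions job; every name, action version, and flag here is an assumption for illustration, not something from this PR:

```yaml
# Hypothetical workflow: publish workspace packages on every merge to master.
name: release-packages
on:
  push:
    branches: [master]
jobs:
  publish:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v2
      - run: pnpm install
      # Recursive publish only releases packages whose current version is
      # not yet on the registry, so unchanged packages are skipped.
      - run: pnpm -r publish --access public --no-git-checks
```

With a release per merge, bisecting a downstream breakage reduces to stepping through published versions.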
Having worked with submodules in the past, I'd advise against going this way, especially in an open-source project. They require more discipline and are harder to grasp.
Signed-off-by: Jan Potoms <2109932+Janpot@users.noreply.github.com>
Agree, most things are a tradeoff; it depends on what's valued. Reminds me of https://youtu.be/0JBOSmRo8js?si=UqYCn7X7bAYL-Pvh&t=240
An alternative to git submodules: https://github.com/ingydotnet/git-subrepo. I have seen it used in https://github.com/ag-grid/ag-website-shared.
Next step in https://www.notion.so/mui-org/mui-docs-02a0476bcb2d4e298e8540fd2df5a137

This creates a DocsProvider, which is intended to store host-specific things for docs components. For now it only contains the docs/config and the UserLanguageProvider.

- i18n moved to the docs package
- Link moved to the docs package

Supersedes #40856